Finding Representations for Memory-Based Language Learning
Author
Abstract
Constructive induction transforms the representation of instances in order to produce a more accurate model of the concept to be learned. For this purpose, a variety of operators have been proposed in the literature, including a Cartesian product operator that forms pairwise higher-order attributes. We study the effect of the Cartesian product operator on memory-based language learning, and demonstrate its impact on generalization accuracy and data compression for a number of linguistic classification tasks, using k-nearest neighbor learning algorithms. These results are compared to a baseline approach of backward sequential elimination of attributes. It is demonstrated that neither approach consistently outperforms the other, and that attribute elimination can be used to derive compact representations for memory-based language learning without noticeable loss of generalization accuracy.
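To make the representation change described in the abstract concrete, here is a minimal sketch, assuming symbolic (string-valued) attributes: a Cartesian-product operator that adds one joint attribute for every pair of original attributes, fed to a simple overlap-metric k-nearest neighbor classifier. The data, attribute values, and function names below are illustrative assumptions, not the paper's actual implementation or tasks.

# Minimal sketch (not the paper's code): pairwise Cartesian-product attributes
# plus an overlap-metric k-NN learner. All data and names are illustrative.
from collections import Counter
from itertools import combinations

def add_pairwise_products(instance):
    # Append one joined value for every pair of original attribute positions.
    return tuple(instance) + tuple(a + "+" + b for a, b in combinations(instance, 2))

def knn_predict(train_X, train_y, query, k=3):
    # Overlap distance: number of attribute positions whose values differ.
    dists = sorted((sum(a != b for a, b in zip(x, query)), y)
                   for x, y in zip(train_X, train_y))
    return Counter(y for _, y in dists[:k]).most_common(1)[0][0]

# Hypothetical windowed letter-context instances with class labels.
raw_X = [("b", "a", "n"), ("a", "n", "a"), ("n", "a", "n")]
raw_y = ["C", "V", "C"]
train_X = [add_pairwise_products(x) for x in raw_X]
print(knn_predict(train_X, raw_y, add_pairwise_products(("b", "a", "n")), k=1))

The backward sequential elimination baseline mentioned in the abstract works in the opposite direction: starting from the full attribute set, it repeatedly drops the attribute whose removal degrades (cross-validated) accuracy the least, which is how the compact representations referred to above can be derived.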
Similar resources
Word Type Effects on L2 Word Retrieval and Learning: Homonym versus Synonym Vocabulary Instruction
The purpose of this study was twofold: (a) to assess the retention of two word types (synonyms and homonyms) in short-term memory, and (b) to investigate the effect of these word types on word learning by asking learners to learn their Persian meanings. A total of 73 Iranian language learners studying English translation participated in the study. For the first purpose, 36 freshmen from an ...
Named Entity Recognition in Persian Text using Deep Learning
Named entities recognition is a fundamental task in the field of natural language processing. It is also known as a subset of information extraction. The process of recognizing named entities aims at finding proper nouns in the text and classifying them into predetermined classes such as names of people, organizations, and places. In this paper, we propose a named entity recognizer which benefi...
Mental Representations of Lyrical Prose
The article analyzes mental representations of Russian lyrical prose texts. The texts demonstrate collective memory engrams that are defined by the cultural and historical legacy of the nation and the authors’ creative world perception. In the architectonics of a lyrical prose text, sense perception reveals itself in accumulated underlying meanings and wisdom conveyed by expressive means. The author’s inte...
Frustratingly Short Attention Spans in Neural Language Modeling
Neural language models predict the next token using a latent representation of the immediate token history. Recently, various methods for augmenting neural language models with an attention mechanism over a differentiable memory have been proposed. For predicting the next token, these models query information from a memory of the recent history which can facilitate learning mid- and long-range de...
Towards an Inquiry-Based Language Learning: Can a Wiki Help?
Wiki use may help EFL instructors to create an effective learning environment for inquiry-based language teaching and learning. The purpose of this study was to investigate the effects of wikis on the EFL learners’ IBL process. Forty-nine EFL students participated in the study while they conducted research projects in English. The Non-wiki group (n = 25) received traditional inquiry instr...
Journal:
Volume, Issue:
Pages: -
Publication date: 1999